Search results for "Gesture recognition"
Showing 5 of 5 documents
Body Gestures and Spoken Sentences: A Novel Approach for Revealing User’s Emotions
2017
In the last decade, there has been a growing interest in emotion analysis research, which has been applied in several areas of computer science. Many authors have contributed to the development of emotion recognition algorithms, considering textual or non-verbal data as input, such as facial expressions, gestures, or, in the case of multi-modal emotion recognition, a combination of them. In this paper, we describe a method to detect emotions from gestures using the skeletal data obtained from Kinect-like devices as input, as well as a textual description of their meaning. The experimental results show that the correlation existing between body movements and spoken user sentence(s) can be u…
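The abstract does not go into implementation detail; as a rough illustration of the first step such a pipeline needs (turning Kinect-style skeletal data into a fixed-length feature vector a classifier can consume), here is a minimal Python sketch. The joint count, feature choices, and function names are illustrative assumptions, not the authors' method, and the spoken-sentence channel is omitted.

```python
import numpy as np

def skeletal_features(joints):
    """Turn a (T, J, 3) array of joint positions from a Kinect-like
    sensor into a fixed-length vector: mean pairwise joint distances
    plus mean per-joint speed across the sequence (illustrative only)."""
    T, J, _ = joints.shape
    # Pairwise distances between joints, averaged over time.
    diffs = joints[:, :, None, :] - joints[:, None, :, :]   # (T, J, J, 3)
    dists = np.linalg.norm(diffs, axis=-1).mean(axis=0)     # (J, J)
    upper = np.triu_indices(J, k=1)
    pose_part = dists[upper]
    # Per-joint speed (frame-to-frame displacement), averaged over time.
    speed = np.linalg.norm(np.diff(joints, axis=0), axis=-1).mean(axis=0)
    return np.concatenate([pose_part, speed])

# Random data standing in for a 2-second capture of 20 joints at 30 fps.
sequence = np.random.rand(60, 20, 3)
print(skeletal_features(sequence).shape)  # (190 + 20,) = (210,)
```

Such a vector could then be fed to any off-the-shelf classifier; the fusion with the textual description of the gesture is a separate step not sketched here.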
Implicit visual analysis in handedness recognition.
1998
In the present study, we addressed the question of whether hand representations, derived from the control of hand gestures, are used in handedness recognition. Pictures of hands and fingers, assuming either common or uncommon postures, were presented to right-handed subjects, who were required to judge their handedness. In agreement with previous results (Parsons, 1987, 1994; Gentilucci, Daprati, & Gangitano, 1998), subjects recognized handedness through mental movement of their own hand in order to match the posture of the presented hand. This was confirmed by a control experiment involving physical matching. The new finding was that presentation of common finger postures affected responses differ…
A gesture recognition framework for exploring museum exhibitions
2018
In this paper, we present a gesture recognition framework that provides visitors to a museum exhibition with a non-intrusive interface for the multimedia enjoyment of digital content. Early experiments were carried out at the Computer History Museum Exhibition of the University of Palermo.
Fundamentals of automated human gesture recognition using 3D integral imaging: a tutorial
2020
Automated human gesture recognition is receiving significant research interest, with applications ranging from novel acquisition techniques to algorithms, data processing, and classification methodologies. This tutorial presents an overview of the fundamental components and basics of the current 3D optical image acquisition technologies for gesture recognition, including the most promising algorithms. Experimental results illustrate some examples of 3D integral imaging, which are compared to conventional 2D optical imaging. Examples of classifying human gestures under normal and degraded conditions, such as low illumination and the presence of partial occlusions, are provided. This tutorial…
Automated Characterization of Mouth Activity for Stress and Anxiety Assessment
2016
Non-verbal information portrayed by human facial expressions encompasses, apart from emotional cues, information relevant to psychophysical status. Mouth activities in particular have been found to correlate with signs of several conditions: depressed people smile less, while fatigued people yawn more. In this paper, we present a semi-automated, robust, and efficient algorithm for extracting mouth activity from video recordings based on Eigen-features and template matching. The algorithm was evaluated for mouth openings and mouth deformations on a minimum-specification dataset of 640x480 resolution and 15 fps. The extracted features were the signals of mouth expa…
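The paper combines Eigen-features with template matching; as a loose sketch of the template-matching half only, assuming OpenCV, a pre-cropped neutral-mouth template, and a video file, one could score how well each frame matches that template. Names and parameters below are illustrative assumptions, not taken from the paper.

```python
import cv2
import numpy as np

def mouth_match_scores(video_path, template_path):
    """Score each frame against a neutral-mouth template using
    normalized cross-correlation; a drop in the score is a crude
    proxy for a mouth opening or deformation."""
    template = cv2.imread(template_path, cv2.IMREAD_GRAYSCALE)
    cap = cv2.VideoCapture(video_path)
    scores = []
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        # Slide the template over the frame and take the best match.
        result = cv2.matchTemplate(gray, template, cv2.TM_CCOEFF_NORMED)
        _, max_val, _, _ = cv2.minMaxLoc(result)
        scores.append(max_val)
    cap.release()
    return np.array(scores)

# Frames whose score falls well below the running average suggest a
# deviation from the neutral mouth template (e.g., an opening or yawn).
```

The authors' actual pipeline additionally projects the mouth region onto Eigen-features before matching; the sketch above shows only generic template matching on raw grayscale frames.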